hadoop fs -ls s3://bucket or s3a://bucket throws "No such file or directory" error
Asked by dz902 on 2021-06-21

On a newly created EMR cluster, running any of the following:

  • hdfs dfs -ls s3://bucket
  • hadoop fs -ls s3://bucket
  • hadoop fs -ls s3a://
  • etc.

...all return the error:

"ls: `s3://bucket': No such file or directory"

  • The EMR instance profile has full S3 access.
  • Nothing S3-related is specified in core-site.xml.
  • aws s3 ls correctly lists all buckets.

Why does this happen?
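
For reference, a minimal diagnostic sketch that may help narrow this down. The bucket name is a placeholder, and the commands are assumed to be run on an EMR node where the Hadoop and AWS CLIs are on the PATH:

    # Bucket-root listings are a known gotcha: some Hadoop FS shell
    # versions only resolve the root of a bucket when a trailing
    # slash is present.
    hadoop fs -ls s3://bucket/

    # Check which filesystem class handles each scheme; on EMR the
    # s3:// scheme is typically backed by EMRFS rather than the
    # open-source s3a connector. (hdfs getconf prints an error if
    # a key is not set anywhere in the configuration.)
    hdfs getconf -confKey fs.s3.impl
    hdfs getconf -confKey fs.s3a.impl

    # Confirm the instance-profile credentials are visible at all.
    aws sts get-caller-identity

If the trailing-slash form works, the failure points at path resolution in the FS shell rather than at permissions.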

Copyright notice: content by "dz902", reproduced under the CC BY-SA 4.0 license with a link to the original source and this disclaimer.
Link to the original article: https://stackoverflow.com/questions/68063712/hadoop-fs-ls-s3-bucket-or-s3a-bucket-throws-no-such-file-or-directory-err
